NetApp could corner AI infrastructure at Insight 2024 – but customer hype will ride on its messaging
NetApp has made a raft of AI-minded updates to its product line within the last year, leaning into hybrid and multi-cloud models as the future of enterprise IT infrastructure.
Last year’s NetApp Insight set out to show how the firm has been an early trailblazer in the AI-ready infrastructure market, seeking to lead the way in helping businesses transform their data architecture for generative AI applications.
But with NetApp Insight 2024, its annual conference, just around the corner, it will need to build on this to make an effective argument for its place among competitors.
The data management powerhouse was recognized as being at the forefront of the cloud storage space in 2023, when it was named the top provider for primary storage in the Gartner Magic Quadrant.
NetApp also scored the highest in Gartner’s Hybrid Cloud IT Operations & Container Use Case in its Critical Capabilities for Primary Storage. A large part of the company’s value proposition is that it offers native integrations with all three hyperscalers, making it an easy choice for many businesses.
At Insight 2023 in Las Vegas, NetApp announced Vertex AI support for Google Cloud NetApp Volumes, promising to help businesses unlock further value from their unstructured data.
Harvinder Bhela, chief product officer at NetApp, boasted that this put the firm in a unique position, with no other data management company enjoying this level of compatibility, telling Insight 2023 attendees: “the hyperscalers don’t do this with anybody else”.
This flexibility feeds into NetApp’s hybrid and multi-cloud strategy, which it believes is essential for any organization looking to harness the true power of AI.
Speaking to ITPro at last year’s conference, George Kurian, CEO at NetApp, argued a hybrid cloud approach would be the only way for businesses to capitalize on generative AI.
Kurian acknowledged that initially, businesses more hesitant to dip their toes into generative AI would be drawn to the public cloud. He specifically noted the safety net of security and availability assurances hyperscalers can offer.
But in the long term, Kurian has bet on hybrid cloud as the go-to model for hosting generative AI workloads, due to its inherent flexibility and cost efficiency.
A key point of focus this year will be whether he’s been proved right yet. For customers to decide this, they’ll need to hear testimony from end-users about how their adoption of a hybrid, multi-cloud model has helped them derive value from generative AI.
What could be even more important is hearing from the right mixture of customers. A spread that includes customers of the big three hyperscalers and those from more risk-averse, or AI-hesitant, sectors would help NetApp demonstrate the scale and flexibility of its offerings.
Maintaining infrastructure momentum
NetApp Insight 2024 has to stay the course and maintain the trajectory set at last year’s conference, which set the stage for a raft of announcements centered on helping businesses modernize their data infrastructure. This was to avoid becoming what Kurian called “data laggers”, a term that may well be revisited this year as NetApp seeks to contrast its warnings with evidence of how it can help customers.
News last year included the latest version of its data management software ONTAP featuring enhanced AI-driven analytics, improved automation, and better efficiency for, you guessed it, hybrid cloud environments. The company also announced expansions of many of its flash storage options including its AFF C-Series capacity product, which will be integrated into its ONTAP AI architecture.
The firm also launched NetApp Cloud Data Sense, a governance dashboard to help businesses map, identify, and report on both structured and unstructured data in their cloud or on-premises infrastructure.
All of this will need to be given added context this year, with any new announcements tied into a cohesive vision for NetApp’s platform. If its strategy of going hard on hybrid and multi-cloud to enable AI innovation is to resonate with customers, NetApp needs a particularly strong through-line at Insight 2024 that shows how all of its announcements complement one another.
Fresh off last year’s strong showing, it can double down on how its infrastructure is synonymous with AI-preparedness. It’s just a question of how well it lands this message.
So far, NetApp’s approach appears to be paying off. The firm outperformed market expectations with its financial results for the first quarter of 2025, reporting a sharp jump in subscription-based sales and noting its annual recurring revenue from all-flash storage arrays rose from $2.8 billion the previous year to $3.5 billion by the end of the period.
Importantly for its messaging, NetApp’s hybrid cloud revenues totaled $1.38 billion, up from $1.28 billion a year earlier.
Kurian welcomed the results, stating the firm had started the fiscal year on a high note and explaining the results were a testament to the firm’s ‘unwavering confidence’ in the benefits businesses can unlock using its intelligent data infrastructure platform.
“I am confident in our ability to capitalize on this momentum, as we address new market opportunities, extend our leadership position in existing markets, and deliver increasing value for all our stakeholders,” he added.
Enterprise interest in AI does not appear to be slowing down and NetApp is in a strong position to capitalize on continued growth in AI demand, particularly when it comes to deploying the data infrastructure to facilitate new workloads.
We can expect to see NetApp announce further integrations and compatibility features to demonstrate why it is the ideal partner for enterprises in the midst of a hybrid, multi-cloud transition.
Ultimately, NetApp Insight 2024 will give the company a much-needed opportunity to set out how it plans to capitalize on its ‘AI advantage’. If it can draw on its deep ties with hyperscalers and lean on customer success stories to shore up its position as the go-to storage and management option, it can establish itself as one of the default cogs in the AI machine.